Welcome back to Pattern Recognition! Today we want to look a bit more into how to actually apply those norms in regression problems.
So, here you see norm-dependent linear regression. Essentially, we now put the norms that we've seen previously into the optimization problem: we have some matrix A and some unknown vector x, we subtract the vector b, and take a norm of the result, ||Ax - b||. We can write this down as a minimization problem, so the variable that we're looking for is determined as the argument that minimizes the respective norm.
Now, different norms will of course lead to different results, and the estimation error epsilon, a scalar value, can be defined as the deviation between the optimal regression result x hat and x star, which denotes the correct value.
This then gives rise to residuals r_1, r_2, up to r_m, where m is the number of observations, and they can be computed as the element-wise deviations of our regression problem. So this is essentially nothing else than r = Ax - b, and the resulting vector gives us the residual terms.
So if b is in the range of A, the residual will be the zero vector, because b can then be completely projected onto the column space of A.
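As a small illustrative sketch (the matrix and vector below are made-up example data, not from the lecture), we can check this in NumPy: if b is constructed to lie in the range of A, the residual Ax - b vanishes.

```python
import numpy as np

# Small overdetermined system: A is 4x2, so the range of A is a
# 2-dimensional subspace of R^4.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [1.0, 2.0]])
x = np.array([2.0, 3.0])

b_in_range = A @ x          # b constructed to lie in the range of A
r = A @ x - b_in_range      # residual vector r = Ax - b
print(r)                    # zero vector, since b is in range(A)
```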
Now, minimization using the 2-norm is something that we've already seen: we minimize ||Ax - b||_2, and this can be rewritten as a minimization over the residuals. The squared 2-norm of the residual can be expressed as the inner product (Ax - b)^T (Ax - b), and now we can do the math.
I omitted this in the previous video, but now you see the full derivation. Multiplying out all the terms gives x^T A^T A x - x^T A^T b - b^T A x + b^T b.
Now we can rearrange this a little bit: the two middle terms are scalars and each is the transpose of the other, so they are equal, and the expression can be written as x^T A^T A x - 2 b^T A x + b^T b.
For the minimization we take the partial derivative of this term with respect to x and set it to zero: 2 A^T A x - 2 A^T b = 0. This gives the well-known pseudo-inverse solution x hat = (A^T A)^{-1} A^T b, which is of course only valid if the columns of A are linearly independent.
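The closed-form pseudo-inverse solution can be sketched directly in NumPy (random example data, chosen here only for illustration); as a sanity check we compare it against NumPy's built-in least-squares solver.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))      # generic Gaussian columns: linearly independent
b = rng.normal(size=20)

# Closed-form pseudo-inverse solution: x_hat = (A^T A)^{-1} A^T b
x_hat = np.linalg.inv(A.T @ A) @ A.T @ b

# Cross-check against NumPy's least-squares solver
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_hat, x_ls))   # True
```

In practice one would solve the normal equations with `np.linalg.solve` or use `lstsq` directly rather than forming the explicit inverse, which is numerically less stable.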
What happens if we use other norms? If we use the maximum norm, for example, the result of the norm is the maximum over the absolute values of the respective residuals, max_i |r_i|. This can be rewritten into the following optimization problem: we minimize a scalar bound r subject to the constraint that Ax - b lies between -r times a vector of ones and r times a vector of ones. So we are essentially trying to shrink those boundaries as closely as possible around the remaining residuals.
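This reformulation is a linear program, so it can be sketched with SciPy's `linprog` (random example data; the variable names are my own, with `t` playing the role of the scalar bound r above).

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 2))
b = rng.normal(size=30)
m, n = A.shape

# Stacked variables z = [x (n entries), t (scalar bound on max |r_i|)]
c = np.r_[np.zeros(n), 1.0]                      # objective: minimize t
A_ub = np.block([[A, -np.ones((m, 1))],          #  (Ax - b) <= t * 1
                 [-A, -np.ones((m, 1))]])        # -(Ax - b) <= t * 1
b_ub = np.r_[b, -b]
bounds = [(None, None)] * n + [(0, None)]        # x free, t >= 0
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)

x_inf = res.x[:n]
print(res.x[n], np.max(np.abs(A @ x_inf - b)))   # t equals the max residual
```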
Then we can also look into the minimization of the L1 norm, which is the sum over the absolute values of the residuals. This can be rewritten into the minimization of 1^T r, where the vector r in m-dimensional space then serves as element-wise lower and upper bounds in the constrained optimization, -r <= Ax - b <= r, and 1 is again the vector of ones.
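The L1 case is also a linear program, sketched below in the same style (again random illustrative data; the slack vector `r` bounds the absolute residuals element-wise, as in the constraint above).

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
A = rng.normal(size=(30, 2))
b = rng.normal(size=30)
m, n = A.shape

# Stacked variables z = [x (n entries), r (m slack entries bounding |r_i|)]
c = np.r_[np.zeros(n), np.ones(m)]              # objective: minimize 1^T r
A_ub = np.block([[A, -np.eye(m)],               #  (Ax - b) <= r
                 [-A, -np.eye(m)]])             # -(Ax - b) <= r
b_ub = np.r_[b, -b]
bounds = [(None, None)] * n + [(0, None)] * m   # x free, r >= 0
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)

x_l1 = res.x[:n]
print(res.fun, np.sum(np.abs(A @ x_l1 - b)))    # objective equals sum |r_i|
```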
So let's look a bit into an application of this: ridge regression and unit balls. Here we have the minimization of the 2-norm of Ax - b plus lambda times the 2-norm of x. Let's visualize this a little bit; let's take the unit ball
Access: Open Access
Duration: 00:18:03 min
Recording date: 2020-11-06
Uploaded: 2020-11-07 00:37:22
Language: en-US
In this video, we show the effects of norm-dependent regression.
This video is released under CC BY 4.0. Please feel free to share and reuse.
Music Reference: Damiano Baldoni - Thinking of You